Decentralized Consensus Algorithm with Delayed and Stochastic Gradients
Authors
Benjamin Sirb, Xiaojing Ye

Abstract
We analyze the convergence of a decentralized consensus algorithm with delayed gradient information across the network. The nodes in the network privately hold parts of the objective function and collaboratively solve for the consensus optimal solution of the total objective while they can only communicate with their immediate neighbors. In real-world networks, it is often difficult a...
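To make the setting concrete, here is a minimal sketch of decentralized gradient descent with delayed gradients: each node mixes its iterate with its neighbors' and descends along a stale local gradient. The ring topology, quadratic objectives, delay model, and step size are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Minimal sketch: n nodes on a ring, each privately holding a quadratic
# f_i(x) = 0.5*(x - b_i)^2; the consensus optimum of the sum is mean(b).
# All constants and the delay model are illustrative assumptions.
rng = np.random.default_rng(0)
n, T, tau, alpha = 5, 400, 3, 0.05        # nodes, iterations, max delay, step
b = rng.normal(size=n)                    # private data held by each node
W = np.zeros((n, n))                      # doubly stochastic ring mixing matrix
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.25

x = np.zeros((T + 1, n))                  # keep history so stale iterates exist
for t in range(T):
    d = rng.integers(0, tau + 1, size=n)  # each node sees its own delay
    s = np.maximum(0, t - d)              # time stamp of the stale iterate
    grad = x[s, np.arange(n)] - b         # local gradient at the delayed point
    x[t + 1] = W @ x[t] - alpha * grad    # mix with neighbors, then descend

print("node values:", x[-1])              # entries cluster near mean(b)
print("consensus target:", b.mean())
```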
Related resources

DSA: Decentralized Double Stochastic Averaging Gradient Algorithm
This paper considers convex optimization problems where nodes of a network have access to summands of a global objective. Each of these local objectives is further assumed to be an average of a finite set of functions. The motivation for this setup is to solve large scale machine learning problems where elements of the training set are distributed to multiple computational elements. The decentr...
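As a rough illustration of the "average of a finite set of functions" structure, the sketch below shows the SAGA-style stochastic averaging gradient a single node could maintain over its local components; it omits DSA's consensus recursion entirely, and the least-squares data and step size are made-up assumptions.

```python
import numpy as np

# Sketch of the "stochastic averaging" ingredient at one node: its local
# objective is an average of m component functions, and it keeps a table
# of the last gradient computed for each component (SAGA-style), so each
# sampled component yields a variance-reduced gradient estimate. This
# shows the estimator only, not DSA's full consensus recursion.
rng = np.random.default_rng(1)
m, d, steps, alpha = 20, 3, 3000, 0.02
A = rng.normal(size=(m, d))
y = rng.normal(size=m)

def comp_grad(x, i):
    # gradient of the i-th component 0.5*(A[i] @ x - y[i])**2
    return (A[i] @ x - y[i]) * A[i]

x = np.zeros(d)
table = np.stack([comp_grad(x, i) for i in range(m)])  # stored gradients
avg = table.mean(axis=0)
for _ in range(steps):
    i = rng.integers(m)                    # sample one local component
    g_new = comp_grad(x, i)
    est = g_new - table[i] + avg           # variance-reduced estimate
    avg += (g_new - table[i]) / m          # keep the running table average
    table[i] = g_new
    x -= alpha * est

print("gradient norm at x:", np.linalg.norm(A.T @ (A @ x - y)) / m)  # ~ 0
```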
Decentralized Consensus Optimization with Asynchrony and Delay
We propose an asynchronous, decentralized algorithm for consensus optimization. The algorithm runs in a network of agents, where the agents perform local computation and communicate with neighbors. We design our algorithm so that the agents can compute and communicate independently, at different times, for different durations. This reduces the waiting time for the slowest agent or longest commu...
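A toy sketch of the asynchronous pattern described here: at each tick one randomly activated agent averages the values it last saw from its neighbors and takes a local gradient step, with no global clock and no waiting. The topology, objectives, and activation model are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

# Toy sketch of asynchrony: at each tick a single randomly chosen agent
# wakes up, averages the values it last saw from its ring neighbors, and
# takes a local gradient step; no agent ever waits for another. The
# quadratic objectives f_i(x) = 0.5*(x - b_i)^2 are assumptions.
rng = np.random.default_rng(2)
n, ticks, alpha = 4, 4000, 0.05
b = rng.normal(size=n)                       # private data; target is mean(b)
nbrs = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
x = np.zeros(n)

for _ in range(ticks):
    i = rng.integers(n)                      # only agent i is active this tick
    local_avg = (x[i] + sum(x[j] for j in nbrs[i])) / 3.0
    x[i] = local_avg - alpha * (x[i] - b[i]) # local gradient of f_i

print("agent values:", x)                    # cluster near the target below
print("network-wide optimum:", b.mean())
```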
The Decrits Consensus Algorithm: Decentralized Agreement without Proof of Work
Decrits is a cryptocurrency in development that makes use of a novel consensus algorithm that does not require proof-of-work. This paper describes how the Decrits Consensus Algorithm (DCA) is as trustless as a proof-of-work algorithm while offering superior transaction security at virtually no cost. Cryptocurrency: A digital construction of classical money that is protected from duplication and...
Escaping Saddles with Stochastic Gradients
We analyze the variance of stochastic gradients along negative curvature directions in certain nonconvex machine learning models and show that stochastic gradients exhibit a strong component along these directions. Furthermore, we show that, contrary to the case of isotropic noise, this variance is proportional to the magnitude of the corresponding eigenvalues and not decreasing in the dimensiona...
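The effect this abstract describes can be reproduced on a toy two-parameter model y ≈ w1*w2*x, whose loss has a saddle at the origin: slightly off the saddle, per-sample gradients concentrate along the Hessian's negative-curvature eigenvector. The model and data below are assumptions chosen to make that visible, not the paper's experiments.

```python
import numpy as np

# Toy reproduction: fit y ≈ w1*w2*x, whose expected loss
# 0.5*(w1*w2*x - y)^2 has a saddle at w = (0, 0). Slightly off the
# saddle, per-sample gradients concentrate along the Hessian's
# negative-curvature eigenvector, so SGD noise itself points off the
# saddle. Model and data are assumptions, not the paper's experiments.
rng = np.random.default_rng(3)
m = 1000
xs = rng.normal(size=m)
ys = 2.0 * xs + 0.1 * rng.normal(size=m)     # planted slope, so E[x*y] > 0

w = np.array([1e-3, 2e-3])                   # small perturbation off the saddle
r = w[0] * w[1] * xs - ys                    # per-sample residuals
grads = np.stack([r * w[1] * xs, r * w[0] * xs], axis=1)  # per-sample gradients

exy = np.mean(xs * ys)
H = np.array([[0.0, -exy], [-exy, 0.0]])     # average Hessian at the saddle
eigvals, eigvecs = np.linalg.eigh(H)         # eigenvalues ascending: -exy, +exy
v_neg, v_pos = eigvecs[:, 0], eigvecs[:, 1]

print("variance along negative curvature:", (grads @ v_neg).var())
print("variance along positive curvature:", (grads @ v_pos).var())  # ~9x smaller
```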
Journal
Journal title: SIAM Journal on Optimization
Year: 2018
ISSN: 1052-6234, 1095-7189
DOI: 10.1137/16m1081257